Multiple-Gradient Descent Algorithm for Multiobjective Optimization
Author
Abstract
The steepest-descent method is a well-known and effective single-objective descent algorithm when the gradient of the objective function is known. Here, we propose a particular generalization of this method to multi-objective optimization by considering the concurrent minimization of n smooth criteria {J_i} (i = 1, ..., n). The novel algorithm is based on the following observation: consider a finite set of vectors {u_i} (u_i ∈ R^N, n ≤ N); in the convex hull of this family, there exists a unique element of minimal norm, say ω ∈ R^N; then, the scalar product of ω with any vector in the convex hull, and in particular with any u_i, is at least equal to ‖ω‖^2 ≥ 0. Applying this to the objective-function gradients (u_i = ∇J_i), we conclude that either ω = 0, and the current design point belongs to the Pareto set, or −ω is a descent direction common to all objective functions. We propose to construct a fixed-point iteration in which updates of the element ω are used as successive directions of search. This method converges to a point on the Pareto set. This result applies to both finite-dimensional and functional design spaces. Numerical illustrations have been provided in both cases using either analytical objective functions, or (discretized) functionals in [9] [5]. Here, following [6], a domain-decomposition framework is used to illustrate the necessity, in a (discretized) functional setting, to scale the gradients appropriately.
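As an illustration of the idea (a minimal sketch, not the paper's implementation), the two-objective case admits a closed form: the minimal-norm element of the convex hull of two gradients u_1, u_2 is (1−t) u_1 + t u_2 with t = clip(u_1·(u_1−u_2)/‖u_1−u_2‖^2, 0, 1). The toy bi-objective problem below (two quadratic bowls with distinct minima, whose Pareto set is the segment between them) is an assumption chosen for illustration:

```python
import numpy as np

def min_norm_2(u1, u2):
    """Minimal-norm element of the convex hull of {u1, u2} (closed form).

    Minimizes ||(1 - t) u1 + t u2||^2 over t in [0, 1].
    """
    d = u1 - u2
    denom = d @ d
    if denom == 0.0:          # identical gradients: the hull is a single point
        return u1.copy()
    t = np.clip((u1 @ d) / denom, 0.0, 1.0)
    return (1.0 - t) * u1 + t * u2

# Toy bi-objective problem (illustrative assumption):
# J1(x) = ||x - a||^2, J2(x) = ||x - b||^2.
# Every point on the segment [a, b] is Pareto-optimal.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])

x = np.array([2.0, 2.0])
for _ in range(500):
    g1 = 2.0 * (x - a)                 # gradient of J1
    g2 = 2.0 * (x - b)                 # gradient of J2
    omega = min_norm_2(g1, g2)
    if np.linalg.norm(omega) < 1e-8:   # omega ~ 0: Pareto-stationary point
        break
    x = x - 0.1 * omega                # -omega descends both objectives

print(x)  # converges to a point on the segment [a, b], here [0.5, 0.5]
```

By symmetry of the starting point, the iteration lands on the midpoint of the Pareto segment; other starting points converge to other Pareto-stationary points, which is consistent with the fixed-point view described above.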
Similar resources
A Note on the Descent Property Theorem for the Hybrid Conjugate Gradient Algorithm CCOMB Proposed by Andrei
In [1] (Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization, J. Optim. Theory Appl. 141 (2009) 249-264), an efficient hybrid conjugate gradient algorithm, the CCOMB algorithm, is proposed for solving unconstrained optimization problems. However, the proof of Theorem 2.1 in [1] is incorrect due to an erroneous inequality that was used to establish the descent property for the s...
A new hybrid conjugate gradient algorithm for unconstrained optimization
In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. This new method can generate sufficient descent directions independently of any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...
An eigenvalue study on the sufficient descent property of a modified Polak-Ribière-Polyak conjugate gradient method
Based on an eigenvalue analysis, a new proof for the sufficient descent property of the modified Polak-Ribière-Polyak conjugate gradient method proposed by Yu et al. is presented.
Explicit gradient information in multiobjective optimization
This paper presents an algorithm that converges to points that satisfy a first order necessary condition of weakly Pareto solutions of multiobjective optimization problems. Hints on how to include second order information are given. Preliminary numerical results are encouraging.
A New Hybrid Evolutionary Multiobjective Algorithm Guided by Descent Directions
Hybridization of local-search-based algorithms with evolutionary algorithms is still an under-explored research area in multiobjective optimization. In this paper, we propose a new multiobjective algorithm based on a local search method. The main idea is to generate new non-dominated solutions by adding a linear combination of descent directions of the objective functions to a parent solution ...
Journal:
Volume  Issue
Pages  -
Publication date: 2012